by CM
Posted on May 09, 2020
In this article, we will quickly explore how to save a Keras model to Google Cloud Storage. This is especially useful if you want to deploy your model and make it accessible online.
Google Cloud Storage is a RESTful online file storage web service for storing and accessing data on Google Cloud Platform infrastructure. The service combines the performance and scalability of Google's cloud with advanced security and sharing capabilities. It is an Infrastructure as a Service (IaaS) offering, comparable to the Amazon S3 online storage service. Unlike Google Drive, and judging by the two services' specifications, Google Cloud Storage is aimed more at enterprise use.
Let's jump right into creating a Cloud Storage bucket. We need one because all data in GCS is stored in so-called buckets, which can be thought of as containers that hold the data. Buckets can be created in several ways, for example: (1) in the Google Cloud Storage UI, (2) from Google Colab, or (3) with the Google Cloud Console.
# As an example, I created a new bucket with the Cloud Console.
# Note: this step is not done in Python but in the Cloud Console or Cloud SDK.
BUCKET_NAME = 'my_new_bucket'
PROJECT_NAME = 'project-12345'
MODEL_NAME = 'my_model'
MODEL_VERSION = 'v1.0'
PYTHON_VERSION = '3.7'
REGION = 'us-central1'
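If you prefer to create the bucket from Python rather than the Cloud Console, a minimal sketch using the official `google-cloud-storage` client library could look like the following. The helper names are my own, and it assumes the library is installed and your GCP credentials are already configured:

```python
BUCKET_NAME = 'my_new_bucket'
PROJECT_NAME = 'project-12345'
REGION = 'us-central1'

def bucket_uri(bucket_name):
    """Build the gs:// URI for a bucket name."""
    return 'gs://' + bucket_name

def create_bucket(bucket_name, project, location):
    """Create a new GCS bucket (requires valid GCP credentials)."""
    from google.cloud import storage  # pip install google-cloud-storage
    client = storage.Client(project=project)
    return client.create_bucket(bucket_name, location=location)

# Usage (only works with valid credentials):
# bucket = create_bucket(BUCKET_NAME, PROJECT_NAME, REGION)
# print('Created', bucket_uri(bucket.name))
```

Note that bucket names are globally unique across all of GCS, so `my_new_bucket` will almost certainly have to be changed.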
Second, we jump into the Python code and start by importing our dependencies.
# For saving the model
import tensorflow as tf
from tensorflow.keras.models import Sequential
# For authentication with GCS
import os
from google.colab import auth as google_auth
Luckily, we created an h5 model in the last article, which we will now upload to Google Cloud Storage. To prepare for that, we need to do two things: (A) authenticate with GCS, and (B) read the model from our local disk or the Colab file explorer into Python.
# Authentication
# Copy the provided authentication code into the respective web input field.
google_auth.authenticate_user()
# Reading the model from local disk / the Colab file explorer.
my_model = tf.keras.models.load_model('my_model.h5')
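Outside Colab, `google.colab.auth` is not available; the Google client libraries instead pick up credentials from the `GOOGLE_APPLICATION_CREDENTIALS` environment variable pointing at a service-account key file. A small sketch (the helper name and the key-file path are placeholders of my own):

```python
import os

def use_service_account(key_path):
    """Point Google's client libraries at a service-account key file.

    Outside Colab there is no google.colab.auth; the client libraries
    read credentials from this environment variable instead.
    """
    os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = key_path
    return os.environ['GOOGLE_APPLICATION_CREDENTIALS']

# Example with a placeholder path:
# use_service_account('/path/to/service-account-key.json')
```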
The last step is uploading the model to our new GCS bucket.
BUCKET_NAME = 'my_new_bucket'
FOLDER_NAME = 'my_new_folder'
GS_PATH = 'gs://' + BUCKET_NAME + '/' + FOLDER_NAME
# Export the model in SavedModel format directly to the GCS path.
# Note: tf.saved_model.save() returns None, so there is nothing to assign.
tf.saved_model.save(my_model, GS_PATH)
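Alternatively, if you want to keep the model in its HDF5 format instead of re-exporting it as a SavedModel, you can upload the .h5 file as a single object with the `google-cloud-storage` client. A sketch under the same assumptions as before (installed library, configured credentials; helper names are my own):

```python
import os

def model_blob_path(folder, local_path):
    """Destination object path inside the bucket, e.g. 'my_new_folder/my_model.h5'."""
    return folder + '/' + os.path.basename(local_path)

def upload_h5(bucket_name, folder, local_path):
    """Upload a local .h5 file to gs://<bucket_name>/<folder>/<file>."""
    from google.cloud import storage  # pip install google-cloud-storage
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(model_blob_path(folder, local_path))
    blob.upload_from_filename(local_path)
    return blob

# Usage (requires valid credentials):
# upload_h5('my_new_bucket', 'my_new_folder', 'my_model.h5')
```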
Done. We have now successfully saved our model on GCS. You can make it publicly available by changing the access permission rights in GCS.
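The permission change can also be scripted. A sketch using the `google-cloud-storage` client's `make_public()` method (this only works when the bucket does not use uniform bucket-level access, which disallows per-object ACLs; helper names are my own):

```python
def public_url(bucket_name, blob_path):
    """The URL a publicly readable object is served from."""
    return 'https://storage.googleapis.com/' + bucket_name + '/' + blob_path

def make_object_public(bucket_name, blob_path):
    """Mark a single object as publicly readable and return its URL."""
    from google.cloud import storage  # pip install google-cloud-storage
    blob = storage.Client().bucket(bucket_name).blob(blob_path)
    blob.make_public()  # grants allUsers the READER role on the object
    return blob.public_url

# Usage (requires valid credentials and sufficient rights on the bucket):
# make_object_public('my_new_bucket', 'my_new_folder/my_model.h5')
```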
Make ML Public! #Google Cloud Storage